On the Convergence of Descent

Author

  • M. Patriksson

Abstract

Recently, Zhu and Marcotte [15] established the convergence of a modified descent algorithm for monotone variational inequalities. Using algorithmic equivalence results due to Patriksson [10, 11] and Larsson and Patriksson [7], we show that this convergence result may be used to establish the convergence of slightly modified versions of the classical successive approximation algorithms of Dafermos [2] and Cohen [1], and of the descent algorithms of Wu et al. [14], Patriksson [10, 11], and Larsson and Patriksson [7], under assumptions that are both much milder and much easier to verify than those for their original statements.
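
For orientation, the problem class and the iteration pattern discussed above can be sketched as follows; this is a generic formulation of a monotone variational inequality and of a successive approximation step, not the precise statement of any of the cited algorithms. Given a nonempty, closed and convex set $X \subseteq \mathbb{R}^n$ and a monotone mapping $F : X \to \mathbb{R}^n$, the problem $\mathrm{VI}(F, X)$ is to find $x^* \in X$ such that

\[
  F(x^*)^{\mathsf{T}} (x - x^*) \ge 0 \quad \text{for all } x \in X,
\]

where monotonicity means $(F(x) - F(y))^{\mathsf{T}} (x - y) \ge 0$ for all $x, y \in X$. A successive approximation (cost approximation) step in the spirit of Dafermos [2] replaces $F$ at the current iterate $x^k$ by an approximating mapping $\Phi(\cdot, x^k)$ with $\Phi(x, x) = F(x)$, and lets

\[
  x^{k+1} \ \text{solve} \ \mathrm{VI}\bigl(\Phi(\cdot, x^k), X\bigr);
\]

the descent algorithms referred to above use such a subproblem solution to define a search direction along which a gap (merit) function for $\mathrm{VI}(F, X)$ is reduced.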




Publication date: 1994